Search for: All records (4 total resources)
- Veeravalli, V. V. (Ed.). Abstract: Shared information is a measure of mutual dependence among multiple jointly distributed random variables with finite alphabets. For a Markov chain on a tree with a given joint distribution, we give a new proof of an explicit characterization of shared information. The Markov chain on a tree is shown to possess a global Markov property based on graph separation; this property plays a key role in our proofs. When the underlying joint distribution is not known, we exploit the special form of this characterization to provide a multiarmed bandit algorithm for estimating shared information, and analyze its error performance. (See the definition recalled after this result list.)
- Bhattacharya, Sagnik; Narayan, Prakash (Proceedings of the 2023 IEEE Symposium on Information Theory)
- Bhattacharya, Sagnik; Narayan, Prakash (Proceedings of the 2022 IEEE Symposium on Information Theory)
- Bhattacharya, Sagnik; Narayan, Prakash (2021 IEEE International Symposium on Information Theory). Technical Program Committee, 2021 IEEE (Ed.). Abstract: Consider a finite set of multiple sources, described by a random variable with m components. Only k ≤ m source components are sampled and jointly compressed in order to reconstruct all the m components under an excess distortion criterion. Sampling can be that of a fixed subset A with |A| = k, or randomized over all subsets of size k. In the case of random sampling, the sampler may or may not be aware of the m source components. The compression code consists of an encoder whose input is the realization of the sampler and the sampled source components; the decoder input is solely the encoder output. The combined sampling mechanism and rate distortion code are universal in that they must be devised without exact knowledge of the prevailing source probability distribution. In a Bayesian setting, considering coordinated single-shot sampling and compression, our contributions involve achievability results for the cases of fixed-set, source-independent, and source-dependent random sampling. (See the sampling sketch after this list.)
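The first record's abstract treats shared information without restating its definition. As a reference point only, here is the standard formulation from the shared-information literature for finite-alphabet random variables (an assumed formulation, not quoted from the record): it is the minimum, over all partitions of the variables into at least two groups, of the normalized gap between the sum of the groups' joint entropies and the overall joint entropy.

```latex
% Standard definition of shared information (assumed formulation; not quoted from the record).
% X_1, ..., X_m: jointly distributed random variables with finite alphabets; H(.) is Shannon entropy.
% The minimum runs over all partitions \pi = {A_1, ..., A_l} of {1, ..., m} into l >= 2 nonempty parts.
SI(X_1, \dots, X_m)
  = \min_{\substack{\pi = \{A_1, \dots, A_\ell\} \\ \ell \ge 2}}
    \frac{1}{\ell - 1}
    \left( \sum_{i=1}^{\ell} H\bigl(X_{A_i}\bigr) - H(X_1, \dots, X_m) \right)
```

For m = 2 this reduces to the mutual information I(X_1; X_2), which is why shared information is read as a multivariate measure of mutual dependence.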
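The 2021 record's abstract separates three sampling mechanisms that determine what the encoder observes: a fixed subset A with |A| = k, random sampling chosen without looking at the source realization, and random sampling that may depend on it. The sketch below is a minimal, hypothetical illustration of that distinction only; the function names and the toy source-dependent rule are assumptions for illustration and are not the paper's construction.

```python
import random

def fixed_set_sampler(x, A):
    """Fixed-set sampling: always observe the components indexed by the fixed subset A."""
    A = sorted(A)
    return A, [x[i] for i in A]

def source_independent_sampler(x, k, rng):
    """Random sampling of k components, chosen without looking at the realization x."""
    A = sorted(rng.sample(range(len(x)), k))
    return A, [x[i] for i in A]

def source_dependent_sampler(x, k):
    """Source-aware sampling: a toy rule that keeps the k largest components
    (illustrative only; the chosen subset now depends on the realization x)."""
    A = sorted(sorted(range(len(x)), key=lambda i: x[i], reverse=True)[:k])
    return A, [x[i] for i in A]

def encoder_input(sampler_output):
    """The encoder sees the sampler realization (the subset) plus the sampled components;
    the decoder would see only the encoder's output, which is not modeled here."""
    A, samples = sampler_output
    return {"subset": A, "samples": samples}

if __name__ == "__main__":
    rng = random.Random(0)
    m, k = 6, 3
    x = [rng.randint(0, 1) for _ in range(m)]  # one realization of the m source components
    print(encoder_input(fixed_set_sampler(x, [0, 2, 4])))
    print(encoder_input(source_independent_sampler(x, k, rng)))
    print(encoder_input(source_dependent_sampler(x, k)))
```

In the source-dependent case the chosen subset itself can reveal something about the realization, which is the distinction the abstract draws when it says the sampler "may or may not be aware of the m source components".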